@pexip/media-processor
A library for media analysis using Web APIs.
Enumerations
Interfaces
- Analyzer
- AnalyzerSubscribableOptions
- AsyncAssets
- AudioGraph
- AudioGraphOptions
- AudioNodeInit
- AudioNodeProps
- AudioStats
- Benchmark
- Clock
- Denoise
- Detector
- Frame
- Gain
- Mask
- Point
- Process
- RenderParams
- Runner
- Segmentation
- SegmentationParams
- Segmenter
- Size
- StatsOptions
- SubscribableOptions
- ThrottleOptions
- Transform
- VideoProcessor
- WorkletMessagePortOptions
- WorkletModule
Type Aliases
AnalyzerNodeInit
Ƭ AnalyzerNodeInit: AudioNodeInit<AnalyserNode, Analyzer>
AsyncCallback
Ƭ AsyncCallback: () => Promise<void>
Type declaration
▸ (): Promise<void>
Returns
Promise<void>
AudioBufferBytes
Ƭ AudioBufferBytes: Uint8Array
Same as the return from AnalyserNode.getByteFrequencyData()
AudioBufferFloats
Ƭ AudioBufferFloats: Float32Array
Same as AudioBuffer, or the return from AnalyserNode.getFloatFrequencyData()
AudioDestinationNodeInit
Ƭ AudioDestinationNodeInit: AudioNodeInit<AudioDestinationNode, AudioDestinationNode>
AudioNodeConnectParam
Ƭ AudioNodeConnectParam: ConnectParamBase<ConnectParamType>
AudioNodeInitConnectParam
Ƭ AudioNodeInitConnectParam: ConnectParamBase<ConnectInitParamType>
AudioNodeInitConnection
Ƭ AudioNodeInitConnection: AudioNodeInitConnectParam[]
AudioNodeInitConnections
Ƭ AudioNodeInitConnections: AudioNodeInitConnection[]
AudioNodeParam
Ƭ AudioNodeParam: BaseAudioNode | AudioParam
AudioSamples
Ƭ AudioSamples: number[] | AudioBufferFloats | AudioBufferBytes
Audio samples from each channel, either in float or byte form
Example
| | | Sample Frame 1 | Sample Frame 2 | Sample Frame 3 |
|---|---|---|---|---|
| Input | Channel L: | sample 1 | sample 2 | sample 3 |
| | Channel R: | sample 1 | sample 2 | sample 3 |
We get two AudioSamples from "Input": one for "Channel L" and one for "Channel R".
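To illustrate the table above, here is a hypothetical helper (not part of this library) that produces one AudioSamples array per channel from an interleaved source; Web Audio buffers are already planar, so this only shows the per-channel shape:

```typescript
// Hypothetical helper (not part of this library): split an
// interleaved [L, R, L, R, ...] source into one AudioSamples
// array per channel, matching the "Channel L" / "Channel R"
// rows in the table above.
const deinterleave = (
    interleaved: number[],
    channelCount: number,
): number[][] => {
    const channels: number[][] = Array.from({length: channelCount}, () => [])
    interleaved.forEach((sample, i) => {
        channels[i % channelCount].push(sample)
    })
    return channels
}
```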
BaseAudioNode
Ƭ BaseAudioNode: Pick<AudioNode, NodeConnectionAction>
Callback
Ƭ Callback<R, T>: (...params: T) => R
Type parameters
Name | Type |
---|---|
R | R |
T | extends unknown[] |
Type declaration
▸ (...params): R
Parameters
Name | Type |
---|---|
...params | T |
Returns
R
Canvas
Ƭ Canvas: HTMLCanvasElement | OffscreenCanvas
CanvasContext
Ƭ CanvasContext: CanvasRenderingContext2D | OffscreenCanvasRenderingContext2D
ChannelSplitterNodeInit
Ƭ ChannelSplitterNodeInit: AudioNodeInit<ChannelSplitterNode, ChannelSplitterNode>
Color
Ƭ Color: Object
Type declaration
Name | Type |
---|---|
a | number |
b | number |
g | number |
r | number |
ConnectInitParamBaseType
Ƭ ConnectInitParamBaseType: AudioParam | AudioNodeInit
ConnectInitParamType
Ƭ ConnectInitParamType: ConnectInitParamBaseType | undefined
ConnectParamBase
Ƭ ConnectParamBase<T>: [T, T extends undefined ? undefined : AudioNodeOutputIndex, T extends undefined ? undefined : AudioNodeInputIndex] | T
Type parameters
Name | Type |
---|---|
T | extends ConnectParamType | ConnectInitParamType |
ConnectParamBaseType
Ƭ ConnectParamBaseType: AudioParam | BaseAudioNode
ConnectParamType
Ƭ ConnectParamType: ConnectParamBaseType | undefined
DelayNodeInit
Ƭ DelayNodeInit: AudioNodeInit<DelayNode, DelayNode>
DenoiseWorkletNodeInit
Ƭ DenoiseWorkletNodeInit: AudioNodeInit<AudioWorkletNode>
GainNodeInit
Ƭ GainNodeInit: AudioNodeInit<GainNode, Gain>
ImageType
Ƭ ImageType: CanvasImageSource | ProcessInputType | VideoFrame
InputFrame
Ƭ InputFrame: CanvasImageSource | VideoFrame
IsVoice
Ƭ IsVoice<T>: (data: T) => boolean
Type parameters
Name |
---|
T |
Type declaration
▸ (data): boolean
Parameters
Name | Type |
---|---|
data | T |
Returns
boolean
MaskUnderlyingType
Ƭ MaskUnderlyingType: "canvasimagesource" | "imagedata" | "tensor"
From the "@tensorflow-models/body-segmentation" interfaces
MediaElementAudioSourceNodeInit
Ƭ MediaElementAudioSourceNodeInit: AudioNodeInit<MediaElementAudioSourceNode, MediaElementAudioSourceNode>
MediaStreamAudioDestinationNodeInit
Ƭ MediaStreamAudioDestinationNodeInit: AudioNodeInit<MediaStreamAudioDestinationNode, MediaStreamAudioDestinationNode>
MediaStreamAudioSourceNodeInit
Ƭ MediaStreamAudioSourceNodeInit: AudioNodeInit<MediaStreamAudioSourceNode, MediaStreamAudioSourceNode>
Node
Ƭ Node: AudioNode | AudioParam
NodeConnectionAction
Ƭ NodeConnectionAction: "connect" | "disconnect"
Nodes
Ƭ Nodes: Node[]
ProcessInputType
Ƭ ProcessInputType: ImageData | HTMLVideoElement | HTMLImageElement | OffscreenCanvas | HTMLCanvasElement | ImageBitmap
ProcessStatus
Ƭ ProcessStatus: "created" | "opened" | "opening" | "processing" | "idle" | "closed" | "destroying" | "destroyed"
ProcessVideoTrack
Ƭ ProcessVideoTrack: (track: MediaStreamVideoTrack, transformers: Transformer<InputFrame, InputFrame>[], options?: Options) => Promise<Track>
Type declaration
▸ (track, transformers, options?): Promise<Track>
Parameters
Name | Type |
---|---|
track | MediaStreamVideoTrack |
transformers | Transformer<InputFrame, InputFrame>[] |
options? | Options |
Returns
Promise<Track>
Rect
RenderEffects
Ƭ RenderEffects: typeof RENDER_EFFECTS[number]
RunnerCreator
Ƭ RunnerCreator<P, R>: (callback: Callback<R, P>, frameRate: number) => Runner<P>
Type parameters
Name | Type |
---|---|
P | extends unknown[] |
R | R |
Type declaration
▸ (callback, frameRate): Runner<P>
Parameters
Name | Type |
---|---|
callback | Callback<R, P> |
frameRate | number |
Returns
Runner<P>
SegmentationModel
Ƭ SegmentationModel: typeof SEG_MODELS[number]
SegmentationTransform
Ƭ SegmentationTransform: Transform<InputFrame, InputFrame> & SegmentationParams & Omit<Process, "open">
UniversalAudioContextState
Ƭ UniversalAudioContextState: AudioContextState | "interrupted"
Adds the missing "interrupted" state needed to work with AudioContextState in iOS Safari. See https://developer.mozilla.org/en-US/docs/Web/API/BaseAudioContext/state#resuming_interrupted_play_states_in_ios_safari
Unsubscribe
Ƭ Unsubscribe: () => void
Type declaration
▸ (): void
Cancels the subscription
Returns
void
WasmPaths
Ƭ WasmPaths: [string, string | undefined]
Variables
BACKGROUND_BLUR_AMOUNT
• Const BACKGROUND_BLUR_AMOUNT: 3
CLIP_COUNT_THRESHOLD
• Const CLIP_COUNT_THRESHOLD: 6
Default clipping count threshold. Number of consecutive clipThreshold-level samples that indicate clipping.
CLIP_THRESHOLD
• Const CLIP_THRESHOLD: 0.98
Default clipping detection threshold
EDGE_BLUR_AMOUNT
• Const EDGE_BLUR_AMOUNT: 3
FLIP_HORIZONTAL
• Const FLIP_HORIZONTAL: false
FOREGROUND_THRESHOLD
• Const FOREGROUND_THRESHOLD: 0.5
FRAME_RATE
• Const FRAME_RATE: 20
LOW_VOLUME_THRESHOLD
• Const LOW_VOLUME_THRESHOLD: -60
Default low volume detection threshold
MONO_THRESHOLD
• Const MONO_THRESHOLD: number
Default mono detection threshold. Data must be identical within one 16-bit LSB to be identified as mono.
PROCESSING_HEIGHT
• Const PROCESSING_HEIGHT: 432
PROCESSING_WIDTH
• Const PROCESSING_WIDTH: 768
RENDER_EFFECTS
• Const RENDER_EFFECTS: readonly ["none", "blur", "overlay"]
SEG_MODELS
• Const SEG_MODELS: readonly ["mediapipeSelfie"]
SILENT_THRESHOLD
• Const SILENT_THRESHOLD: number
Default silent threshold. At least one 16-bit LSB of data (comparison is on absolute value).
VOICE_PROBABILITY_THRESHOLD
• Const VOICE_PROBABILITY_THRESHOLD: 0.3
Default voice probability threshold
urls
• Const urls: Object
Type declaration
Name | Type |
---|---|
denoise | () => URL |
Functions
avg
▸ avg(nums): number
Average an array of numbers
Parameters
Name | Type | Description |
---|---|---|
nums | number [] | An array of numbers |
Returns
number
calculateDistance
▸ calculateDistance(p1, p2): number
Calculate the distance between two Points
Parameters
Name | Type | Description |
---|---|---|
p1 | Point | Point 1 |
p2 | Point | Point 2 |
Returns
number
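A sketch of the formula presumably used here, the standard Euclidean distance (an assumption, not taken from the library source):

```typescript
// Hypothetical sketch of calculateDistance: straight-line
// (Euclidean) distance between two Points. An assumption about
// the formula, not the library implementation.
interface Point {
    x: number
    y: number
}

const distance = (p1: Point, p2: Point): number =>
    Math.hypot(p2.x - p1.x, p2.y - p1.y)
```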
calculateFps
▸ calculateFps(time): number
Parameters
Name | Type |
---|---|
time | number |
Returns
number
closedCurve
▸ closedCurve(«destructured»): (data: Point[]) => string
Create a cubic Bezier curve path that turns back to the starting point via the provided reference point
Example
closedCurve({x:0, y:20})([{x:0, y:0}, {x:3, y:4}, {x:9, y:16}]);
// Output:
// M 0,0 C 0,0 1.778263374435667,1.8280237767745193 3,4 C 6.278263374435667,9.828023776774518 9,16 9,16 V 20 H 0 Z
Parameters
Name | Type |
---|---|
«destructured» | Point |
Returns
fn
▸ (data): string
Parameters
Name | Type |
---|---|
data | Point [] |
Returns
string
copyByteBufferToFloatBuffer
▸ copyByteBufferToFloatBuffer(bytes, floats): void
Copy data from a Uint8Array buffer to a Float32Array buffer, with byte-to-float conversion
Parameters
Name | Type | Description |
---|---|---|
bytes | Uint8Array | The source Byte buffer |
floats | Float32Array | The destination buffer |
Returns
void
createAnalyzerGraphNode
▸ createAnalyzerGraphNode(options?): AudioNodeInit<AnalyserNode, Analyzer>
Create an AnalyzerNodeInit
See
AnalyserOptions
Parameters
Name | Type |
---|---|
options? | AnalyserOptions |
Returns
AudioNodeInit<AnalyserNode, Analyzer>
createAnalyzerSubscribableGraphNode
▸ createAnalyzerSubscribableGraphNode(«destructured»): AudioNodeInit<AnalyserNode, Analyzer>
Create an analyzer node with push-based subscription
Parameters
Name | Type |
---|---|
«destructured» | AnalyzerSubscribableOptions & AnalyserOptions |
Returns
AudioNodeInit<AnalyserNode, Analyzer>
createAsyncCallbackLoop
▸ createAsyncCallbackLoop<P, R>(callback, frameRate, «destructured»?): Object
Create an async callback loop that is invoked recursively with a delay based on the frameRate
Type parameters
Name | Type |
---|---|
P | extends unknown [] |
R | extends Promise <unknown , R > |
Parameters
Name | Type | Description |
---|---|---|
callback | Callback <R , P > | The callback to be invoked |
frameRate | number | The rate to be expected to invoke the callback |
«destructured» | Partial <AsyncCallbackLoopOptions > | - |
Returns
Object
Name | Type |
---|---|
start | (...params : P ) => Promise <void > |
stop | () => void |
get frameRate() | number |
createAudioContext
▸ createAudioContext(options?): AudioContext
A function to create an AudioContext using the constructor or factory function, depending on browser support
See
AudioContextOptions
Parameters
Name | Type |
---|---|
options? | AudioContextOptions |
Returns
AudioContext
createAudioDestinationGraphNode
▸ createAudioDestinationGraphNode(): AudioNodeInit<AudioDestinationNode, AudioDestinationNode>
Create an AudioDestinationNode
Returns
AudioNodeInit<AudioDestinationNode, AudioDestinationNode>
createAudioGraph
▸ createAudioGraph(initialConnections, options?): AudioGraph
Accepts AudioNodeInitConnections to build the audio graph within a single audio context
Example
const source = createStreamSourceGraphNode(stream);
const analyzer = createAnalyzerGraphNode({fftSize});
const audioGraph = createAudioGraph([[source, analyzer]]);
Parameters
Name | Type | Description |
---|---|---|
initialConnections | AudioNodeInitConnections | A list of AudioNodeInit to build the graph in a linear fashion |
options | AudioGraphOptions | - |
Returns
AudioGraph
createAudioGraphProxy
▸ createAudioGraphProxy(audioGraph, handlers): AudioGraph
Parameters
Name | Type |
---|---|
audioGraph | AudioGraph |
handlers | AudioGraphProxyHandlers |
Returns
AudioGraph
createAudioSignalDetector
▸ createAudioSignalDetector(shouldDetect, onDetected): (buffer: Queue<number[]>, threshold?: number) => (samples: number[]) => void
Create a function to process the AudioStats and check if silent
The onSignalDetected callback is called in two situations:
Logic
| lastCheck | silent | should call onSignalDetected |
|---|---|---|
| 0 | 0 | 0 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 1 | 1 | 0 |
Parameters
Name | Type |
---|---|
shouldDetect | () => boolean |
onDetected | (silent : boolean ) => void |
Returns
fn
▸ (buffer, threshold?): (samples: number[]) => void
Parameters
Name | Type |
---|---|
buffer | Queue <number []> |
threshold? | number |
Returns
fn
▸ (samples): void
Parameters
Name | Type |
---|---|
samples | number [] |
Returns
void
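The truth table in createAudioSignalDetector above amounts to firing the callback only when the silent state flips; a minimal sketch of that change-detection logic (an illustration of the table, not the library source):

```typescript
// Minimal sketch of the truth table above: onDetected fires only
// when the silent state changes (lastCheck XOR silent), matching
// the 0|1 and 1|0 rows. An illustration, not the library source.
const createChangeDetector = (
    onDetected: (silent: boolean) => void,
): ((silent: boolean) => void) => {
    let lastCheck = false
    return (silent: boolean) => {
        if (silent !== lastCheck) onDetected(silent)
        lastCheck = silent
    }
}
```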
createAudioStats
▸ createAudioStats(stats?, options?): AudioStats
AudioStats builder
Parameters
Name | Type | Description |
---|---|---|
stats | Partial <AudioStats > | overwrite the default attributes |
options | Object | silentThreshold , lowVolumeThreshold and clipCountThreshold |
options.clipCountThreshold? | number | - |
options.lowVolumeThreshold? | number | - |
options.silentThreshold? | number | - |
Returns
AudioStats
createBenchmark
▸ createBenchmark(clock?, «destructured»?): Benchmark
Parameters
Name | Type | Default value |
---|---|---|
clock | Clock | performance |
«destructured» | BenchmarkOptions | {} |
Returns
Benchmark
createCanvasTransform
▸ createCanvasTransform(segmenter, «destructured»?): SegmentationTransform
Parameters
Name | Type |
---|---|
segmenter | Segmenter |
«destructured» | Partial <Options > |
Returns
SegmentationTransform
createChannelMergerGraphNode
▸ createChannelMergerGraphNode(options?): AudioNodeInit<ChannelMergerNode, ChannelMergerNode>
Create a ChannelMergerNode
See
ChannelMergerOptions
Parameters
Name | Type |
---|---|
options? | ChannelMergerOptions |
Returns
AudioNodeInit<ChannelMergerNode, ChannelMergerNode>
createChannelSplitterGraphNode
▸ createChannelSplitterGraphNode(options?): AudioNodeInit<ChannelSplitterNode, ChannelSplitterNode>
Create a ChannelSplitterNode
See
ChannelSplitterOptions
Parameters
Name | Type |
---|---|
options? | ChannelSplitterOptions |
Returns
AudioNodeInit<ChannelSplitterNode, ChannelSplitterNode>
createDelayGraphNode
▸ createDelayGraphNode(options?): AudioNodeInit<DelayNode, DelayNode>
Create a DelayNode
See
DelayOptions
Parameters
Name | Type |
---|---|
options? | DelayOptions |
Returns
AudioNodeInit<DelayNode, DelayNode>
createDenoiseWorkletGraphNode
▸ createDenoiseWorkletGraphNode(data, messageHandler?): AudioNodeInit<AudioWorkletNode, AudioWorkletNode>
Create a noise suppression node
Parameters
Name | Type | Description |
---|---|---|
data | BufferSource | WebAssembly source |
messageHandler? | (vads : number []) => void | - |
Returns
AudioNodeInit<AudioWorkletNode, AudioWorkletNode>
createFrameCallbackRequest
▸ createFrameCallbackRequest(callback, frameRate, «destructured»?): Object
Create a callback loop for video frame processing, using requestVideoFrameCallback under the hood when available, otherwise a fallback implementation based on setTimeout.
Parameters
Name | Type | Description |
---|---|---|
callback | Callback <Promise <void >, [ProcessInputType ]> | To be called by the loop |
frameRate | number | A fallback frame rate when we are not able to get the rate from the API |
«destructured» | FrameCallbackRequestOptions | - |
Returns
Object
Name | Type |
---|---|
start | (input : ProcessInputType ) => Promise <void > |
stop | () => void |
get frameRate() | number |
createGainGraphNode
▸ createGainGraphNode(mute): AudioNodeInit<GainNode, Gain>
Create a GainNodeInit
Parameters
Name | Type | Description |
---|---|---|
mute | boolean | initial mute state |
Returns
AudioNodeInit<GainNode, Gain>
createMediaElementSourceGraphNode
▸ createMediaElementSourceGraphNode(mediaElement): AudioNodeInit<MediaElementAudioSourceNode, MediaElementAudioSourceNode>
Create a MediaElementAudioSourceNodeInit
Parameters
Name | Type |
---|---|
mediaElement | HTMLMediaElement |
Returns
AudioNodeInit<MediaElementAudioSourceNode, MediaElementAudioSourceNode>
createMediapipeSegmenter
▸ createMediapipeSegmenter(basePath?, «destructured»?): Segmenter
Parameters
Name | Type | Default value |
---|---|---|
basePath | string | '/' |
«destructured» | Partial <Options > | {} |
Returns
Segmenter
createStreamDestinationGraphNode
▸ createStreamDestinationGraphNode(options?): AudioNodeInit<MediaStreamAudioDestinationNode, MediaStreamAudioDestinationNode>
Create a MediaStreamAudioDestinationNodeInit
Parameters
Name | Type |
---|---|
options? | AudioNodeOptions |
Returns
AudioNodeInit<MediaStreamAudioDestinationNode, MediaStreamAudioDestinationNode>
createStreamSourceGraphNode
▸ createStreamSourceGraphNode(mediaStream, shouldResetEnabled?): AudioNodeInit<MediaStreamAudioSourceNode, MediaStreamAudioSourceNode>
Create a MediaStreamAudioSourceNodeInit
Parameters
Name | Type | Default value | Description |
---|---|---|---|
mediaStream | MediaStream | undefined | Source MediaStream |
shouldResetEnabled | boolean | true | Whether or not to enable the cloned track |
Returns
AudioNodeInit<MediaStreamAudioSourceNode, MediaStreamAudioSourceNode>
createVADetector
▸ createVADetector(onDetected, shouldDetect, options?): <T>(isVoice: IsVoice<T>) => (data: T) => void
Create a voice detector based on provided params
See
ThrottleOptions
Parameters
Name | Type | Description |
---|---|---|
onDetected | () => void | When there is voice activity, this callback will be called |
shouldDetect | () => boolean | When this returns true, voice detection functions; otherwise it does not |
options? | ThrottleOptions |
Returns
fn
▸ <T>(isVoice): (data: T) => void
Type parameters
Name |
---|
T |
Parameters
Name | Type |
---|---|
isVoice | IsVoice <T > |
Returns
fn
▸ (data): void
Parameters
Name | Type |
---|---|
data | T |
Returns
void
createVideoProcessor
▸ createVideoProcessor(transformers, processTrack): VideoProcessor
Parameters
Name | Type |
---|---|
transformers | Transform<InputFrame, InputFrame>[] |
processTrack | ProcessVideoTrack |
Returns
VideoProcessor
createVideoTrackProcessor
▸ createVideoTrackProcessor(): ProcessVideoTrack
Returns
ProcessVideoTrack
createVideoTrackProcessorWithFallback
▸ createVideoTrackProcessorWithFallback(«destructured»?): ProcessVideoTrack
Parameters
Name | Type |
---|---|
«destructured» | FallbackOptions |
Returns
ProcessVideoTrack
createVoiceDetectorFromProbability
▸ createVoiceDetectorFromProbability(voiceThreshold?): IsVoice<number>
A function to check whether the provided probability is considered voice activity
Parameters
Name | Type | Default value | Description |
---|---|---|---|
voiceThreshold | number | VOICE_PROBABILITY_THRESHOLD | A threshold of the probability to be considered as voice activity |
Returns
IsVoice<number>
createVoiceDetectorFromTimeData
▸ createVoiceDetectorFromTimeData(options?): IsVoice<number[]>
A function to check whether the provided time series data is considered voice activity
See
VAOptions
Parameters
Name | Type |
---|---|
options | VAOptions |
Returns
IsVoice<number[]>
curve
▸ curve(data): string
Create a cubic Bezier curve path command
Example
curve([{x:0, y:0}, {x:3, y:4}, {x:9, y:16}]);
// Output:
// M 0,0 C 0,0 1.778263374435667,1.8280237767745193 3,4 C 6.278263374435667,9.828023776774518 9,16 9,16
Parameters
Name | Type | Description |
---|---|---|
data | Point [] | An array of Points |
Returns
string
fitDestinationSize
▸ fitDestinationSize(sw, sh, dw, dh): Rect
Convert the source size to the destination size when necessary, based on the height
Parameters
Name | Type | Description |
---|---|---|
sw | number | Source width |
sh | number | Source height |
dw | number | destination width |
dh | number | destination height |
Returns
Rect
fromByteToFloat
▸ fromByteToFloat(value): number
Convert a byte to a float, according to the Web Audio spec
A floating point audio sample is defined as non-interleaved IEEE 754 32-bit linear PCM with a nominal range between -1 and +1, that is, a 32-bit floating point buffer with each sample between -1.0 and 1.0 https://developer.mozilla.org/en-US/docs/Web/API/AudioBuffer
Byte samples are represented as follows: 128 is silence, 0 is the negative max, 255 is the positive max
Remarks
Ref. https://www.w3.org/TR/webaudio/#dom-analysernode-getbytetimedomaindata
Parameters
Name | Type | Description |
---|---|---|
value | number | The byte value to convert to float |
Returns
number
fromFloatToByte
▸ fromFloatToByte(value): number
Convert a float to a byte, according to the Web Audio spec
A floating point audio sample is defined as non-interleaved IEEE 754 32-bit linear PCM with a nominal range between -1 and +1, that is, a 32-bit floating point buffer with each sample between -1.0 and 1.0 https://developer.mozilla.org/en-US/docs/Web/API/AudioBuffer
Byte samples are represented as follows: 128 is silence, 0 is the negative max, 255 is the positive max
Remarks
Ref. https://www.w3.org/TR/webaudio/#dom-analysernode-getbytetimedomaindata
Parameters
Name | Type | Description |
---|---|---|
value | number | The float value to convert to byte |
Returns
number
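The byte/float mapping described by these two functions can be sketched as follows; the exact rounding and clamping are assumptions about the spec formula, not the library implementation:

```typescript
// Sketch of the byte <-> float mapping described above: 128 is
// silence, bytes span [0, 255], floats span [-1, 1]. Rounding and
// clamping behaviour are assumptions, not the library source.
const byteToFloat = (value: number): number => (value - 128) / 128

const floatToByte = (value: number): number =>
    Math.max(0, Math.min(255, Math.round(128 * (1 + value))))
```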
getAudioStats
▸ getAudioStats(options): AudioStats
Calculate the audio stats; the samples are expected to be in float form
Remarks
http://www.rossbencina.com/code/real-time-audio-programming-101-time-waits-for-nothing
Parameters
Name | Type | Description |
---|---|---|
options | StatsOptions | See StatsOptions |
Returns
AudioStats
getBezierCurveControlPoints
▸ getBezierCurveControlPoints(«destructured»): [Point, Point]
Spline Interpolation for Bezier Curve
Remarks
Ref. http://scaledinnovation.com/analytics/splines/aboutSplines.html Alt. https://www.particleincell.com/2012/bezier-splines/
Parameters
Name | Type |
---|---|
«destructured» | Object |
› p1 | Point |
› p2 | Point |
› p3 | Point |
› t | number |
Returns
[Point, Point]
isAnalyzerNodeInit
▸ isAnalyzerNodeInit(t): t is AnalyzerNodeInit
Parameters
Name | Type |
---|---|
t | unknown |
Returns
t is AnalyzerNodeInit
isAudioNode
▸ isAudioNode(t): t is AudioNode
Parameters
Name | Type |
---|---|
t | unknown |
Returns
t is AudioNode
isAudioNodeInit
▸ isAudioNodeInit(t): t is AudioNodeInit<AudioNode, BaseAudioNode>
Parameters
Name | Type |
---|---|
t | unknown |
Returns
t is AudioNodeInit<AudioNode, BaseAudioNode>
isAudioParam
▸ isAudioParam(t): t is AudioParam
Parameters
Name | Type |
---|---|
t | unknown |
Returns
t is AudioParam
isClipping
▸ isClipping(clipCount, threshold?): boolean
Check if there is clipping
Parameters
Name | Type | Default value | Description |
---|---|---|---|
clipCount | number | undefined | Number of consecutive clip |
threshold | number | CLIP_COUNT_THRESHOLD | - |
Returns
boolean
true if the clipCount is above the threshold, i.e. clipping
isEqualSize
▸ isEqualSize(widthA, heightA, widthB, heightB): boolean
Compare the provided width and height to see if they are the same
Parameters
Name | Type | Description |
---|---|---|
widthA | number | The width of A |
heightA | number | The height of A |
widthB | number | The width of B |
heightB | number | The height of B |
Returns
boolean
isLowVolume
▸ isLowVolume(gain, threshold?): boolean
Check if the provided gain is below the low volume threshold, in which case it is considered low volume.
Parameters
Name | Type | Default value | Description |
---|---|---|---|
gain | number | undefined | Floating point representation of the gain number |
threshold | number | LOW_VOLUME_THRESHOLD | - |
Returns
boolean
true if the gain is lower than the threshold
isMono
▸ isMono(channels, threshold?): boolean
Check if provided channels are mono or stereo
Default Value
1.0 / 32767
Parameters
Name | Type | Default value | Description |
---|---|---|---|
channels | AudioSamples [] | undefined | Audio channels and assuming the inputs are in floating point form |
threshold | number | MONO_THRESHOLD | Mono detection threshold, default to floating point form |
Returns
boolean
true if they are mono, otherwise stereo
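A minimal sketch of the documented mono rule for two channels (an illustration of the rule, not the library source):

```typescript
// Sketch of the documented mono rule: two channels count as mono
// when every sample pair matches within one 16-bit LSB (1/32767).
// An illustration, not the library implementation.
const MONO_EPSILON = 1.0 / 32767

const looksMono = (left: number[], right: number[]): boolean =>
    left.length === right.length &&
    left.every((sample, i) => Math.abs(sample - right[i]) <= MONO_EPSILON)
```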
isRenderEffects
▸ isRenderEffects(t): t is "blur" | "none" | "overlay"
Parameters
Name | Type |
---|---|
t | unknown |
Returns
t is "blur" | "none" | "overlay"
isSegmentationModel
▸ isSegmentationModel(t): t is "mediapipeSelfie"
Parameters
Name | Type |
---|---|
t | unknown |
Returns
t is "mediapipeSelfie"
isSilent
▸ isSilent(samples, threshold?): boolean
Simple silent detection that only checks the first and last values of the sample data
Default Value
1.0 / 32767, assuming the sample is a float value
Parameters
Name | Type | Default value | Description |
---|---|---|---|
samples | AudioSamples | undefined | Audio sample data; this could be in the form of a floating point number or a byte number, as long as the threshold value is given accordingly. |
threshold | number | SILENT_THRESHOLD | Silent threshold |
Returns
boolean
true when it is silent
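A minimal sketch of the documented shortcut, assuming float samples (an illustration, not the library source):

```typescript
// Sketch of the documented shortcut: only the first and last
// samples are compared (by absolute value) against the threshold.
// An illustration, not the library implementation.
const looksSilent = (
    samples: number[],
    threshold: number = 1.0 / 32767,
): boolean =>
    samples.length === 0 ||
    (Math.abs(samples[0]) <= threshold &&
        Math.abs(samples[samples.length - 1]) <= threshold)
```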
isVoiceActivity
▸ isVoiceActivity(options?): (volume: number) => boolean
A naive voice activity detection
Parameters
Name | Type | Description |
---|---|---|
options | VAOptions | See VAOptions |
Returns
fn
(volume: number) => boolean, true if there is voice
▸ (volume): boolean
Parameters
Name | Type |
---|---|
volume | number |
Returns
boolean
line
▸ line(data): string
Create a straight line path command
Example
line([{x:0, y:0}, {x:2, y:2}]);
// Output:
// M 0,0 L 2,2
Parameters
Name | Type | Description |
---|---|---|
data | Point [] | An array of Points |
Returns
string
loadScript
▸ loadScript(path, id): Promise<void>
Parameters
Name | Type |
---|---|
path | string |
id | string |
Returns
Promise<void>
loadTfjsBackendWebGl
▸ loadTfjsBackendWebGl(): Promise<typeof import("@tensorflow/tfjs-backend-webgl")>
Load the TensorFlow.js WebGL backend module. The resolved module exposes GPGPUContext, MathBackendWebGL, gpgpu_util, setWebGLContext, version_webgl, webgl_util and a default export; see the @tensorflow/tfjs-backend-webgl typings for the full member signatures.
; executeProgram
: () => void
; getAttributeLocation
: (program
: WebGLProgram
, attribute
: string
) => number
; getUniformLocation
: (program
: WebGLProgram
, uniformName
: string
, shouldThrow?
: boolean
) => WebGLUniformLocation
; getUniformLocationNoThrow
: (program
: WebGLProgram
, uniformName
: string
) => WebGLUniformLocation
; pollFence
: (fenceContext
: FenceContext
) => Promise
<void
> ; pollItems
: () => void
; setInputMatrixTexture
: (inputMatrixTexture
: WebGLTexture
, uniformLocation
: WebGLUniformLocation
, textureUnit
: number
) => void
; setOutputMatrixTexture
: (outputMatrixTexture
: WebGLTexture
, rows
: number
, columns
: number
) => void
; setOutputMatrixWriteRegion
: (startRow
: number
, numRows
: number
, startColumn
: number
, numColumns
: number
) => void
; setOutputPackedMatrixTexture
: (outputPackedMatrixTexture
: WebGLTexture
, rows
: number
, columns
: number
) => void
; setOutputPackedMatrixWriteRegion
: (startRow
: number
, numRows
: number
, startColumn
: number
, numColumns
: number
) => void
; setProgram
: (program
: null
| GPGPUContextProgram
) => void
; uploadDenseMatrixToTexture
: (texture
: WebGLTexture
, width
: number
, height
: number
, data
: TypedArray
) => void
; uploadPixelDataToTexture
: (texture
: WebGLTexture
, pixels
: HTMLCanvasElement
| HTMLImageElement
| ImageBitmap
| ImageData
| PixelData
) => void
; waitForQueryAndGetTime
: (query
: WebGLQuery
) => Promise
<number
> } ; MathBackendWebGL
: { constructor
: (gpuResource?
: HTMLCanvasElement
| OffscreenCanvas
| GPGPUContext
) => MathBackendWebGL
; acquireTexture
: any
; activeTimers
: any
; binaryCache
: any
; canvas
: any
; checkCompletionAsync_
: any
; checkCompletion_
: any
; checkNumericalProblems
: any
; computeBytes
: any
; convertAndCacheOnCPU
: any
; dataRefCount
: WeakMap
<object
, number
> ; decode
: any
; disposed
: any
; downloadWaitMs
: any
; endTimer
: any
; floatPrecisionValue
: any
; getAndSaveBinary
: any
; getQueryTime
: any
; getValuesFromTexture
: any
; gpgpu
: GPGPUContext
; gpgpuCreatedLocally
: any
; lastGlFlushTime
: any
; makeOutput
: any
; nextDataId
: any
; numBytesInGPU
: any
; numMBBeforeWarning
: any
; packedReshape
: any
; packedUnaryOp
: any
; pendingDeletes
: any
; pendingDisposal
: any
; pendingRead
: any
; programTimersStack
: any
; releaseGPUData
: any
; startTimer
: any
; texData
: DataStorage
<TextureData
> ; textureManager
: any
; uploadWaitMs
: any
; warnedAboutMemory
: any
; nextDataId
: any
; abs
: <T>(x
: T
) => T
; bufferSync
: <R, D>(t
: TensorInfo
) => TensorBuffer
<R
, D
> ; checkCompileCompletion
: () => void
; checkCompileCompletionAsync
: () => Promise
<boolean
[]> ; compileAndRun
: (program
: GPGPUProgram
, inputs
: TensorInfo
[], outputDtype?
: keyof DataTypeMap
, customUniformValues?
: number
[][], preventEagerUnpackingOfOutput?
: boolean
) => TensorInfo
; createTensorFromGPUData
: (values
: WebGLData
, shape
: number
[], dtype
: keyof DataTypeMap
) => Tensor
<Rank
> ; decRef
: (dataId
: object
) => void
; dispose
: () => void
; disposeData
: (dataId
: object
, force?
: boolean
) => boolean
; disposeIntermediateTensorInfo
: (tensorInfo
: TensorInfo
) => void
; epsilon
: () => number
; floatPrecision
: () => 16
| 32
; getDataInfo
: (dataId
: object
) => TextureData
; getGPGPUContext
: () => GPGPUContext
; getTexture
: (dataId
: object
) => WebGLTexture
; getTextureManager
: () => TextureManager
; getUniformLocations
: () => void
; incRef
: (dataId
: object
) => void
; makeTensorInfo
: (shape
: number
[], dtype
: keyof DataTypeMap
, values?
: string
[] | BackendValues
) => TensorInfo
; memory
: () => WebGLMemoryInfo
; move
: (dataId
: object
, values
: BackendValues
, shape
: number
[], dtype
: keyof DataTypeMap
, refCount
: number
) => void
; numDataIds
: () => number
; packTensor
: (input
: TensorInfo
) => TensorInfo
; read
: (dataId
: object
) => Promise
<BackendValues
> ; readSync
: (dataId
: object
) => BackendValues
; readToGPU
: (dataId
: object
, options?
: DataToGPUWebGLOption
) => GPUData
; refCount
: (dataId
: object
) => number
; runWebGLProgram
: (program
: GPGPUProgram
, inputs
: TensorInfo
[], outputDtype
: keyof DataTypeMap
, customUniformValues?
: number
[][], preventEagerUnpackingOfOutput?
: boolean
, customTexShape?
: [number
, number
]) => TensorInfo
; shouldExecuteOnCPU
: (inputs
: TensorInfo
[], sizeThreshold?
: number
) => boolean
; time
: (f
: () => void
) => Promise
<WebGLTimingInfo
> ; timerAvailable
: () => boolean
; unpackTensor
: (input
: TensorInfo
) => TensorInfo
; uploadToGPU
: (dataId
: object
) => void
; where
: (condition
: Tensor
<Rank
>) => Tensor2D
; write
: (values
: BackendValues
, shape
: number
[], dtype
: keyof DataTypeMap
) => object
; writeTexture
: (texture
: WebGLTexture
, shape
: number
[], dtype
: keyof DataTypeMap
, texHeight
: number
, texWidth
: number
, channels
: string
) => object
} ; GPGPUProgram
: { customUniforms?
: { arrayIndex?
: number
; name
: string
; type
: UniformType
}[] ; enableShapeUniforms?
: boolean
; outPackingScheme?
: PackingScheme
; outTexUsage?
: TextureUsage
; outputShape
: number
[] ; packedInputs?
: boolean
; packedOutput?
: boolean
; userCode
: string
; variableNames
: string
[] } ; WebGLMemoryInfo
: { numBytes
: number
; numBytesInGPU
: number
; numBytesInGPUAllocated
: number
; numBytesInGPUFree
: number
; numDataBuffers
: number
; numTensors
: number
; reasons
: string
[] ; unreliable
: boolean
} ; WebGLTimingInfo
: { downloadWaitMs
: number
; kernelMs
: number
| { error
: string
} ; uploadWaitMs
: number
; wallMs
: number
; getExtraProfileInfo?
: () => string
} ; version_webgl
: "4.13.0"
= "4.13.0"; webgl
: { forceHalfFloat
: typeof forceHalfFloat
} ; forceHalfFloat
: () => void
; setWebGLContext
: (webGLVersion
: number
, gl
: WebGLRenderingContext
) => void
} ; webgl
: { forceHalfFloat
: () => void
} ; forceHalfFloat
: () => void
}>
Returns
Promise<{ GPGPUContext ; MathBackendWebGL ; gpgpu_util ; setWebGLContext ; version_webgl ; webgl_util ; default ; webgl ; forceHalfFloat }>
The resolved module has the same member types as spelled out in the signature above.
loadTfjsCore
▸ loadTfjsCore(prodMode): Promise<unknown>
Parameters
Name | Type |
---|---|
prodMode | boolean |
Returns
Promise<unknown>
loadWasms
▸ loadWasms(paths): Promise<void>
Parameters
Name | Type |
---|---|
paths | WasmPaths [] |
Returns
Promise<void>
pow
▸ pow(exponent): (base: number) => number
Math.pow in curried functional form: number -> number -> number. The exponent is applied first, returning a function of the base.
Parameters
Name | Type | Description |
---|---|---|
exponent | number | The exponent used for the expression |
Returns
fn
Math.pow(base, exponent)
▸ (base): number
Parameters
Name | Type |
---|---|
base | number |
Returns
number
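For illustration, the curried shape can be sketched as follows (a minimal reimplementation for this example, not the library source):

```typescript
// Minimal sketch of a curried pow: supply the exponent first,
// get back a function of the base.
const pow = (exponent: number) => (base: number): number =>
    Math.pow(base, exponent);

// Fixing the exponent yields a reusable function.
const square = pow(2);
square(3); // 9
pow(3)(2); // 8
```

Currying the exponent makes it convenient to build mappers such as `values.map(pow(2))`.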
processAverageVolume
▸ processAverageVolume(data): number
Calculate the average volume using the Root Mean Square, assuming the data is in float form
Parameters
Name | Type | Description |
---|---|---|
data | number [] | Audio frequency data |
Returns
number
resumeAudioOnInterruption
▸ resumeAudioOnInterruption(audioContext): () => void
Resume the AudioContext whenever it is interrupted
Parameters
Name | Type | Description |
---|---|---|
audioContext | AudioContext | The AudioContext to resume |
Returns
fn
▸ (): void
Returns
void
resumeAudioOnUnmute
▸ resumeAudioOnUnmute(context): (track: MediaStreamTrack) => Unsubscribe
Resume the AudioContext whenever the source track is unmuted
Parameters
Name | Type |
---|---|
context | AudioContext |
Returns
fn
▸ (track): Unsubscribe
Parameters
Name | Type | Description |
---|---|---|
track | MediaStreamTrack | The source track to listen to for the unmute event |
Returns
Unsubscribe
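The subscribe/Unsubscribe pattern used here can be sketched against any EventTarget; a MediaStreamTrack is itself an EventTarget, so the real helper can listen for its "unmute" event in the same way. The `subscribe` helper below is illustrative, not the library's implementation:

```typescript
// Illustrative sketch: attach a listener to a target and return a
// function that detaches it (the Unsubscribe pattern).
type Unsubscribe = () => void;

const subscribe = (
    target: EventTarget,
    event: string,
    handler: () => void,
): Unsubscribe => {
    target.addEventListener(event, handler);
    return () => target.removeEventListener(event, handler);
};

// Usage: count "unmute" events until unsubscribed.
const target = new EventTarget();
let calls = 0;
const unsubscribe = subscribe(target, 'unmute', () => {
    calls += 1;
});
target.dispatchEvent(new Event('unmute')); // handler runs, calls === 1
unsubscribe();
target.dispatchEvent(new Event('unmute')); // detached, calls stays 1
```

Returning the teardown function from the subscription keeps setup and cleanup paired at the call site.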
rms
▸ rms(nums): number
Calculate the Root Mean Square of the provided numbers
Parameters
Name | Type | Description |
---|---|---|
nums | number [] | An array of numbers |
Returns
number
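The Root Mean Square itself is simple enough to sketch (a reimplementation consistent with the description above, not the library source):

```typescript
// Root Mean Square: square each sample, average the squares,
// then take the square root. For audio samples in [-1, 1] this
// yields a 0..1 volume estimate.
const rms = (nums: number[]): number =>
    Math.sqrt(nums.reduce((acc, n) => acc + n * n, 0) / nums.length);

rms([3, 4]);                  // sqrt((9 + 16) / 2) ≈ 3.5355
rms([0.5, -0.5, 0.5, -0.5]);  // 0.5
```

Squaring makes the measure insensitive to sign, which is why RMS is the standard volume estimate for oscillating audio samples.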
round
▸ round(num): number
Round a floating point number half away from zero, unlike Math.round, which rounds half toward positive infinity
Example
round(0.5) // 1
round(-0.5) // -1
Parameters
Name | Type | Description |
---|---|---|
num | number | The number to round |
Returns
number
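The half-away-from-zero behaviour can be sketched by mirroring negative inputs through zero (an illustrative reimplementation, not the library source):

```typescript
// Round half away from zero: take the absolute value so Math.round's
// half-up rule applies symmetrically, then restore the sign.
const round = (num: number): number =>
    Math.sign(num) * Math.round(Math.abs(num));

round(0.5);  // 1
round(-0.5); // -1  (Math.round(-0.5) would give -0)
round(-1.5); // -2
```
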
subscribeTimeoutAnalyzerNode
▸ subscribeTimeoutAnalyzerNode(analyzer, options): () => void
Subscribe to a timeout loop to get the data from the Analyzer
Parameters
Name | Type | Description |
---|---|---|
analyzer | Analyzer | The analyzer to subscribe to |
options | AnalyzerSubscribableOptions | Message handler, etc. |
Returns
fn
▸ (): void
Returns
void
subscribeWorkletNode
▸ subscribeWorkletNode<T>(workletNode, options?): () => void
Subscribe to MessagePort messages from an AudioWorkletNode
Type parameters
Name |
---|
T |
Parameters
Name | Type | Description |
---|---|---|
workletNode | AudioWorkletNode | The node to subscribe to |
options | Partial <WorkletMessagePortOptions <T >> | Can pass a message handler here to handle the messages |
Returns
fn
▸ (): void
Returns
void
sum
▸ sum(nums): number
Sum an array of numbers
Parameters
Name | Type | Description |
---|---|---|
nums | number [] | An array of numbers |
Returns
number
toDecibel
▸ toDecibel(gain): number
Convert a floating point gain value into a dB representation relative to full scale (dBFS), i.e. without an explicit reference level, https://en.wikipedia.org/wiki/DBFS
See https://www.w3.org/TR/webaudio#conversion-to-db
Parameters
Name | Type |
---|---|
gain | number |
Returns
number
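The Web Audio conversion referenced above is dB = 20 · log10(gain); a minimal sketch of that formula (not the library source):

```typescript
// dBFS conversion per the Web Audio spec: 20 * log10(gain).
// A gain of 1.0 is 0 dBFS (full scale); halving the gain loses ~6 dB.
const toDecibel = (gain: number): number => 20 * Math.log10(gain);

toDecibel(1);   // 0
toDecibel(0.5); // ≈ -6.02
```

Note that a gain of 0 maps to -Infinity, which is the expected dBFS value for silence.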